About this Show

Book TV

Ben Goldacre. (2013) 'Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients.'




San Francisco, CA, USA

Comcast Cable

Channel 17 (141 MHz)







    March 23, 2013
    4:30 - 5:59pm EDT  

it features over forty vendors and exhibitors, a children's educational area, and 45 authors and poets scheduled. booktv will be live from the los angeles times festival of books. check booktv.org for live coverage. and let us know about book fairs and festivals in your area; we'll post them. post them to our wall or e-mail us. >> for me it was something so right, so dear, so necessary. before we got in trouble, as students, as young people, we studied. we didn't just wake up one morning and say we're going to sit in. we didn't just dream one day that we're going to come to washington and go on the freedom
ride, or march on washington as we did in 1963, or march as we did in 1965. we studied. we prepared ourselves. >> they intimidated so many people, white people in particular, by using that phrase: black power. because when they used the word, the phrase black power, it made many think that black power meant destruction. blowing up the statue of liberty or ground zero. destroying america. it wasn't anything about destroying america. it was about rebuilding america, and having america take on a new paradigm in terms of how we can truly live out, each and every one of us, the pledge we said when we were going to elementary school and high school: land of the free and home of the brave. >> they discuss their personal experiences during the civil rights movement, live from the virginia festival of the book.
tonight at 8:00 eastern, part of booktv this weekend on c-span2. up next on booktv, physician and science writer ben goldacre talks about the influence of the pharmaceutical industry. he argues that pharmaceutical companies hide negative studies and use expensive lobbying to get what they want. this event, from seattle's town hall, lasts about ninety minutes. [applause] >> thank you. first, a fair disclosure: i'm hoping it's a nerdy crowd in here -- [cheering and applause] you are my people. [laughter] there's no reader's-health advice here. i'm not going to tell you how to get the best out of your doctor. there are no idle conspiracy
theories about how drug companies are trying to kill us. it's a story about flaws in how we gather evidence in medicine. these are technical flaws in an important technical process, very well documented in the medical academic professional literature, and what i'm hoping to do is share that more broadly with the public. in particular because there are several very well documented problems which we have failed, as a profession, to fix. and so i think we need the help, more than anything else, of the public -- a massed army of people like nerds and lawyers and doctors and policy people and the rest. so we're going to talk about problems in three domains. we'll talk about how randomized controlled trials, which we like to think of as automatically fair tests, are in
fact not all created alike: there are structural design flaws, so they are no longer fair tests of which treatment is best. we'll talk about marketing, the boring part. and we'll talk about the problem of missing trial data. it's the unspoken dirty secret, if you like, that half of all the clinical trials conducted and completed for the treatments we use today, by our best possible estimate, have not been published, and trials with positive results are about twice as likely to be published as trials with negative results. now this has a huge impact on the information that we use to make informed decisions, as doctors and patients, about which treatment is best. but i'd like to show you how it also has an impact even earlier
on in the process of developing medicines. so this is a zine. a lot of you hipsters don't know what photocopiers are. [laughter] a very long time ago, in the dawn of grooviness, zines were photocopied magazines for the punk scene, or music or art or fashion, and this is the zine for phase one clinical trial participants. when you have just developed a new molecule, you first of all trial it in six healthy young men, to make sure that nothing horrible happens to them. they were volunteers on low wages -- has anyone here been in a phase one trial? often postgraduate students do it for beer money.
there's a weird scene in some towns of people living slightly on the margins -- they become full-time guinea pigs. doctors like to think of this stuff as precise and clinical, but there's a real practical culture to it as well. in the guinea pigs' zine they talk about 'bleeding work,' and they give site reports, with marks out of ten for different places. they talk about unionizing as guinea pigs to get better conditions. they talk about the history of phase one trial participants and human guinea pigs and self-experimentation. it's fantastic. [inaudible] -- i did that in a welsh accent. this is the world of phase one clinical trials. and this man used to be a plumber; he was a participant in
the u.k. trial that went wrong. there were six young men, and they were given a new biological agent -- a new kind of intervention, really. when they were first given it, everything was fine. then, suddenly, very rapidly, things began to go downhill. they ended up in intensive care. they were ventilated and dialysed. their blood was replaced, their circulation began to shut down, and their fingers and toes began to fall off. these are pictures from the newspaper coverage at the time. now, mistakes happen in medicine, and what is important is to distinguish between avoidable and unavoidable mistakes. in the u.k. government inquiry into what happened, there
were two observations made. the first has been acted on. they said: if you're giving a new intervention with a completely unpredictable mode of action, don't give it to all six people at once. stagger it -- dose at three, six, twelve. you wouldn't want to go first. i'm not participating in a phase one clinical trial; i don't know how anyone can have that kind of relationship with risk. but that recommendation has been acted on. the second recommendation has not. it turns out that in some respects this was foreseeable -- not entirely, but there was an extra ground for concern, because a similar intervention had been tried in one person, and it had an adverse outcome congruent with what we saw. but the researchers working on this trial had no idea it had happened, because the
results of that phase one trial hadn't been shared with the research community. and so the recommendation of the u.k. government inquiry was that such results should be disseminated. it hasn't happened. today, only one in ten phase one clinical trials is published within a year, and only one in five is published within five years. but the problem of missing trial data goes beyond just phase one trials, and beyond just posing a risk to phase one trial participants. this is a molecule called -- [inaudible] it prevents abnormal heart rhythms. now, in the 1980s there was a vogue for using these drugs more freely than we do today, in a broader category of patients: everybody who had arrhythmias after a heart attack -- who had just had a heart attack -- was likely to get given
that drug. now, we knew that these drugs stopped abnormal heart rhythms: if you had abnormal heart rhythms, they made them go away. and we had seen that people with those rhythms were at increased risk of cardiac death, so that looked like a good thing. we relied on the fact that the drug was getting rid of the abnormal heart rhythms; we thought it was going to stop people from dying. it turned out we were wrong. when we did the cast trial -- they reported the findings in 1992 -- we discovered that patients given these drugs for those particular kinds of abnormal heart rhythms, after having a heart attack, were at an increased risk of dying. that was an interesting thing to grasp. nobody was being evil here. we were being sloppy and not thinking. we were assuming that an intervention with a positive impact on a surrogate outcome -- it
successfully stopped people having abnormal heart rhythms -- was necessarily going to have a beneficial impact on a real-world outcome, which is death. but we were wrong. and because this prescribing practice was so widespread, because the drug was used so frequently, it turned out, some people estimated, that this caused the unnecessary, potentially avoidable death of around 130,000 u.s. citizens before we gathered real-world outcome data. this next bit is a little bit mean -- i don't want you to think i'm a horrific imperialist. [laughter] sometimes there are academic papers which say that 130,000 is more than died in the vietnam war -- which is only true if you count deaths on the american side. to what extent was it
foreseeable? it was an important wake-up call for doctors, to make sure we don't rely on process outcomes and surrogate outcomes. but interestingly, there had been an earlier study, conducted in 1980, before any of this happened, which hadn't been published. it's a remarkable paper -- a.j. cowley, writing in 1994, after the cast trial came out, after we had found all this out. and he said: when we carried out the study, we put the increased death rate down to the effect of chance; and as the drug wasn't one of the ones used in the prescribing practices of the '80s -- he's being optimistic. he called it an
early warning of trouble ahead. again, note the understatement: by 'trouble ahead' he means the potentially avoidable death of an estimated 130,000 u.s. citizens. is this unusual? well, no.
and it has an impact on the everyday decisions we make about treatment in medicine, all of the time. to understand why missing clinical trial data is important, you need to understand how we use evidence as doctors to make decisions. you need to understand something called a systematic review. it's kind of embarrassing to medicine that it was invented so recently. [laughter] up until the '80s, if you wanted to write a book chapter for a medical textbook, or a review article about the treatment of -- let's say the treatment of depression in patients with physical illness who were also in hospital -- you might go: okay, i can remember a few trials that i've read about in the past; i'll keep my eyes open over the next nine months until the paper is due, and whenever i come across a
trial that is relevant, i'll photocopy it and keep it in a box. your friends would send things over: hey, you're writing that chapter. in that unsystematic way -- much like a 19-year-old student might randomly throw an essay together -- you would produce the medical textbook article that would be the definitive story of treatment. open, also, perhaps even to deliberate cherry-picking, if you had a favorite idea about what you thought was the right answer. in the 1980s, people finally got a grip on the fact that this wasn't quite right. so we invented something called a systematic review. it's a piece of science about how you look for evidence. you describe exactly how you searched: you say what databases you went to, you say exactly what terms you typed in, you say what criteria you used to
identify randomized controlled trials. you type in 'placebo' or whatever, and you get the most systematic and complete summary of all of the evidence that can possibly be drawn together. then you extract all of the numbers -- the main effect, what the treatment did -- you put them in one spreadsheet, and you do something where you add the numbers together. and you get a plot. there's the cochrane collaboration, a global non-profit collaboration: they produce systematic reviews, the gold-standard summaries used by doctors worldwide to make treatment decisions. now, would you like me to explain what a forest plot is and how it works? [inaudible] right. okay. so each of these lines is one trial, and if the line of the
trial is further to the left, it shows a benefit; if it's further to the right, the treatment was harmful. if it touches the vertical line in the middle, it means the trial showed no statistically significant benefit. now, you see some of these trials are narrow: that's because they are big trials, and a big trial is less vulnerable to the play of chance. some are broad: that's because they are small trials, much more vulnerable to the play of chance, so you get a broader estimate. now, these are real-world trials. this is the first systematic review, and it's on a really important question: if you give a steroid injection to a mother about to deliver a premature baby, does it stop the baby from dying? it's important stuff. we call this the baby graph. this is the great challenge of evidence-based medicine, and of teaching epidemiology: you
have to kind of force people to really feel, in the belly, the connection between the abstraction of evidence -- all of the methodological quirks i'm describing to you -- and the real world of suffering and pain, flesh and blood and death. that's the meat of medicine. so what we can see here is that some of these trials show the treatment was beneficial -- one there, one there, not very accurate -- they show that the steroid injection saves lives. some of the trials show there was no benefit. but when you add them all up together, you get this: that's the summary effect size, and overall it's towards the beneficial extreme. overall, it saved lives. why this matters, okay, is that all of
these trials existed for many years, and even though, if you added them up together, the answer was clear, there were many doctors who didn't use steroids when mothers were about to deliver a premature baby. some doctors said: there are two positive trials, it looks useful. other doctors said: there are four negative ones, why should i expose people to unnecessary side effects? i'm not using it; you must be crazy. and because of that -- because there were many doctors who didn't use steroids even though the evidence was there -- babies died. and babies died not because we didn't have the evidence; we had all of the trials. babies died because doctors of that era weren't savvy enough to realize the importance of synthesizing all the evidence we have in one place to get the right answer. i think that's extraordinary.
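The "adding up" described here is, in its simplest form, inverse-variance fixed-effect meta-analysis. A minimal illustrative sketch follows; the trial numbers are invented for demonstration and are not the real steroid trials from the review.

```python
import math

# Each tuple: (deaths_treated, n_treated, deaths_control, n_control)
# Hypothetical numbers for illustration -- NOT the real steroid trials.
trials = [
    (5, 60, 12, 58),
    (30, 350, 44, 345),
    (7, 40, 8, 42),
    (10, 120, 19, 118),
]

def log_odds_ratio(a, n1, c, n2):
    """Log odds ratio of death (treated vs control) and its variance."""
    b, d = n1 - a, n2 - c
    lor = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return lor, var

# Fixed-effect (inverse-variance) pooling: each trial is weighted by
# 1/variance, so big trials -- the narrow lines on a forest plot -- dominate.
num = den = 0.0
for a, n1, c, n2 in trials:
    lor, var = log_odds_ratio(a, n1, c, n2)
    num += lor / var
    den += 1 / var

pooled = num / den
se = math.sqrt(1 / den)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
print(f"pooled OR = {math.exp(pooled):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

With these made-up numbers, individual small trials cross the no-effect line, but the pooled estimate shows a clear benefit, which is exactly the point of the baby graph.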
the flaw in what we might have to call the information architecture of evidence-based medicine meant that babies died. we had the information; we just didn't bring it together in a way that meant we could make informed decisions. now you get the importance of evidence synthesis -- i shouted about dead babies. [laughter] so this whole project, right, of bringing together all of the evidence in one place to get the best, closest, most accurate estimate we can of the benefits and risks of a treatment: this breaks if we don't have all of the evidence. if some of the negative trials are missing, we might say: look, i know you are getting horrible side effects, but it's effective, you must stay on it. if the positive trials happen to be missing, we underestimate the
benefit and miss out on using it. we have to have all of the studies, okay? and we know that is not what has happened. we can demonstrate that trials have gone missing in action, that negative studies have been withheld, in two different ways: stories, which is how you communicate it to a lay audience, or statistics. i am -- [inaudible] [laughter] this is a funnel plot. it's a way of spotting that negative trials have gone missing. it relies on an interesting and clever phenomenon. firstly, it's easier for a small trial to disappear than for a massive controlled trial being conducted in thirty different hospitals in ten countries, employing 1808 different people, with a total cost of $100 million. so small trials are easier to brush under the carpet.
each of these dots is one trial. if a dot is further over here, it shows the treatment was beneficial; over here, it shows it did harm; in the middle, no effect. as you go from bottom to top, you get bigger trials at the top and smaller ones at the bottom. okay. you can probably already see that up at the top there are a lot of big trials, and they give you an accurate estimate of the treatment's effect size. that's why we like big surveys rather than surveys of 23 people, right? we want big surveys because we think they are less vulnerable to random error. so the big trials should cluster around the true effect of the treatment. as you come down to the smaller studies, they are spread out: more random error. over here to the right you have some positive ones, and on the other side some negative ones, purely from the play of chance. now, if there is publication bias,
that means that negative studies will be missing from the picture. but it won't be the big ones; it will be the smaller ones. and so there will be dots missing. where will they be missing? will they be missing here? no. [laughter] will they be missing here? no. [laughter] will they be missing here? interesting. okay. so this is a funnel plot. you can see, here, the smaller positive studies; and over here, where you would expect the smaller negative studies to be, there's a gap. the regression line plotted through them shows a statistically significant deviation from the funnel shape you would expect -- and this is only a subset of the studies. it is a funnel plot looking for missing trials. i think it's the funniest
epidemiology joke you will ever hear. [laughter] you can also demonstrate the presence of missing trials with a story. this is a drug i prescribed. a patient of mine didn't get better on one treatment, so it seemed better to try something different. i had a quick look at the literature; i found a couple of trials saying it was just as good. so i signed the prescription and gave it to my patient. then in 2010, a paper was published by the german government's cost-effectiveness agency. they described an extraordinary battle: they said, we will not look at the drug --
we won't look at the drug until you give us all of the trials you have done on it. and the company said no. and the agency said: well then, we're not going to have anything to do with the medicine. give us the trials. they got ahold of them, and what they found was this: there were three studies, comparing the drug with placebo, that found it was beneficial, and they were published. but there were five more studies, showing it was no better than placebo, that weren't published -- with data collected on three times as many patients -- and studies showing it was worse than other antidepressants. i did everything i was supposed to do before i prescribed it. i read the trials and concluded they were well designed. but i was misled. when you add up all the data,
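The pattern described here, positive trials published and negative ones buried, can be simulated. This is a toy model of my own, not data from the talk: a drug with a modest true effect, many small trials, and a "file drawer" that eats small unimpressive results.

```python
import random
import statistics

random.seed(0)

# Illustrative simulation (invented, not from the talk): a drug with a
# modest true benefit, in arbitrary effect-size units.
TRUE_EFFECT = 0.1

def run_trial(n):
    """Observed mean effect of one trial of size n (per-patient sd = 1)."""
    return random.gauss(TRUE_EFFECT, 1 / n ** 0.5)

all_trials, published = [], []
for _ in range(500):
    n = random.choice([25, 25, 25, 400])   # mostly small trials, a few big
    eff = run_trial(n)
    all_trials.append(eff)
    # Big trials are too visible to bury; small trials only surface
    # when they look impressively positive.
    if n == 400 or eff > 0.2:
        published.append(eff)

print("mean effect, all trials:     %.2f" % statistics.mean(all_trials))
print("mean effect, published only: %.2f" % statistics.mean(published))
```

The published literature ends up suggesting roughly double the true benefit, which is the mechanism behind the story above: the evidence a prescriber can see is systematically flattering.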
by some people's estimates the drug is completely useless: it only exposes people to side effects. at best it's marginally beneficial, and that's not enough. there's an important thing we have to grasp here. i'm not saying there are drugs on the market which are actively harmful; i think that's unusual. the problem we have, when half of the evidence is missing, is that we don't know which treatment is best. when you get an inferior treatment, you are being deprived of a better treatment. that's a difficult thing for people to grasp. most treatments in medicine, we have to recognize, do a huge amount of good, because of the gradual accumulation of evidence. now, let's say there is one treatment which saves eight lives out of a hundred, and you get people to use a treatment which will only save six lives out of a hundred. you might award yourself a point: you have given people what is still an effective treatment. but if they would otherwise have been using the one that saves eight lives out of a hundred, the net effect of the switch is two avoidable deaths out of a hundred. i think the difficult thing here is that it's a failure of ambition -- we haven't even noticed. that's an important nuance. i'm not some loony running around saying stop taking your pills, they're going to kill you. what i'm saying is that the flaws in the information architecture are such that we cannot be certain that what we think is best is necessarily the best. so -- we have the room until 11:00 tonight. [laughter] is this unusual? it turns out the answer is no. as i said, the best currently available evidence comes from a systematic review, so there's no opportunity for cherry-picking. it's a summary of all the studies that have been done looking for evidence of publication bias. it's current, and it shows that, on average, around half of the trials that are conducted and completed don't go on to get published, and trials with positive results are about twice as likely to be published as trials with negative results. to give you an illustration: if you want to find out whether trials disappear depending on whether they have positive or negative results, you need a list of trials conducted and completed. you can get that in different ways. you can use fda documents. there's an interesting paper where they took all of the trials that the fda held on twelve antidepressants -- basically all of the ones you've heard of, approved over the course of just under a decade. they got all of the trials. getting all of the trials is otherwise still a distant dream, and this is a representative example set of trials, because it's all of the trials done before each drug was approved -- the ones used in order to get approval from the fda. it's a real pain to get ahold of this: you have to use the freedom of information act, you have to wait, you have to chase your application. i only mention this because i had dinner with the man who did the paper yesterday. so, finally, with all of the trials here in the fda documents, here is the story of what happened. there were 74 trials overall: 38 with positive and 36 with negative results. then they looked in the academic journals to see what was there, and they found a markedly different picture: overall, three negative results, and the rest appearing positive. in fact, it's worse than that: several of the negative trials got published as if they had positive results, using all kinds of data-massaging techniques. we have the room until 1:00 in the morning. [laughter]
>> if you deleted half of the data points from within one of your own studies, we would rightly call that research misconduct. that is correct. and what i find fascinating is that we have this strange cultural blind spot in academia: for some reason it is considered acceptable to leave a whole study unpublished. even though we know that when that happens, there is the same overall impact on the overall picture -- on a real world of flesh and blood and suffering and pain and death, right? for some reason, we do not regard that as research misconduct.
no medical or professional body in the world does. it's a very strange oversight. you know, this is not a story about obviously evil people trying to make money. this is a flaw in the culture of medicine -- something that we have known about for 30 years and have failed to fix. worse than that, we lull people into a false sense of security with fake fixes, and i will tell you about those as well. [inaudible] [laughter]
so, the first fix: some years ago, the editors of the major medical journals said that we will never again publish a clinical trial unless it has been posted, before it began, on a publicly accessible register. because, they said, okay: if people want to publish a trial, they want to get it in a high-impact journal. therefore we have that carrot. if we are the editors of the medical journals, we can tell people: you're not allowed to publish unless you registered the trial beforehand. now, registration doesn't by itself get you the results, but it creates the opportunity where we can see what is being conducted, and then we can go on and say: at least we have a chance of spotting what's missing. and everybody said: fantastic, then everything is fixed.
that was the international committee of medical journal editors. [inaudible] i can tell you that -- what's that? [laughter] [inaudible] [laughter] it's a hard life; i have to try a joke. [laughter] so, years after this regulation was put into place -- five years later -- we finally discovered that it has been ludicrously and widely ignored.
even in trials published in the top five journals -- and that is just the papers that were published. academic journals themselves have a whole bunch of conflicts of interest. they have cultural conflicts as well: a positive trial will bump up the impact factor of the journal, which is derived from the number of citations it gets. and it's not even often a full-time job, being an academic journal editor. and drug companies pay lots of money for reprints: when a positive trial is published in an academic medical journal, drug company sales representatives buy reprints for five or 10 or $20 a pop, and they give those to doctors in order to encourage
them to use that treatment. so that means academic journals have a huge conflict of interest. so, who here has ever published in one of those journals? okay, great. when you do that -- i'm not asking you what your conflicts are, but what kind of stuff does one have to declare in there? >> [inaudible] >> yes: how much money you get from your talks, whether you are funded for your research or not. so last year i published a paper in the british medical journal where we said: okay, could you tell us how many reprint orders you have got? and they said: no, that is confidential information. the new england journal wouldn't give it to us either. jama wouldn't give it to us. so what i find interesting is that they are perfectionists about making sure that you
give a clear declaration of interests when you publish in their journal, and yet, when we asked them a simple direct question -- you know, we are interested in this process -- they say that is confidential. it is just interesting. anyway. [laughter] so the second big fix is the fda amendment act of 2007. that law said that you have to post your clinical trial results, within one year of completion, on a website called clinicaltrials.gov. if you don't, you have to pay a $10,000-a-day fine, which would be a lot for many people, but at $3.5 million a year would be a parking ticket if you are a big organization that is making
billions of dollars. and there is no routine audit. when one was conducted and published in 2012, the rate of compliance was genuinely astonishing. everybody assumes that you have to post everything within one year; the compliance rate, as of last year, was 22%. four out of five trials ignored the fda amendment act, which is the single thing most commonly cited by people who want to wave off any of these concerns. 22%. and yet no fine has ever been
applied to those trials. but everyone is pretending it's all fixed. even more preposterous than that: even if the law had been uniformly and perfectly adhered to, it would have done nothing to improve the evidence for the medicine we practice today. about 80 to 85% of the treatments we prescribe in medicine today came on the market more than 10 years ago. so a law that only covers new trials, even if it fixed everything, might have an impact on medicine in the year 2029, but it isn't going to do anything for us now. what we need is retrospective access to all of the study reports, all the information on the medicines that we are using today. it's all in dry storage archives and nuclear bunkers and
whatever. [laughter] so there is a very long and detailed story here which i'm not going to tell you. [laughter] only because -- as i just said, you can get so embedded in a set of problems that you sort of start to lose your orientation on what is normal. my wife and friends would say: yes, this is a good example of that. so this is what was sent back when researchers asked the european medicines agency about some weight-loss drugs -- we are worried, can you tell us what you hold? and they sent this. when you show it to a roomful of
everyday people, the moment that they see it, they spontaneously laugh. right? what was going through their heads? you know, did anyone think: well, this is a very fair way to behave? my job has been a bit funny lately. it's a strange business. now, it is interesting, because the european medicines agency will now give you what they hold on request. and they invite you to award them a point for this, as if they gave it freely and fabulously out of their own generosity. but the reality is that this came after the most damning of findings: they had failed to give a coherent or even a consistent account.
and apparently we have to be grateful. very finally -- the final nail in the coffin of pretending that this stuff has been fixed -- they only hold study reports for the things that they approve, and they have only done so for the last few years. so it's no good: we'd only get the evidence on medicines that came along in the last few years. so, finally: the tamiflu saga. in a flu pandemic, a major medical emergency, what you want is a drug that stops people getting complications.
so there are 14,000 academics in the cochrane collaboration, producing the reviews we use to make informed decisions about which treatments work best. they were asked to update the review on this drug, and they got a touch agitated -- and i blame you, actually. [laughter] >> really? seriously, we are only about a fifth of the way through. [laughter] so they got in touch with the company. they said: we would like the full clinical study reports. and here you have to realize why that is important: the brief summaries of clinical trials that you get in regulatory filings, and even in academic journal
articles, are incomplete, inaccurate, and sometimes misleading about what was measured and analyzed and so on. that was in december of 2009, and still, three years and three months later, the company has failed to hand them over. and this is a drug that accounts for 5% of the entire uk drug budget. it is absolutely mindbending. so now we are about a third of the way through my talk -- there is a choose-your-own-adventure component here. now, why? what time is it, 20 after 8:00? okay. we can do marketing if i type
extremely fast. or would you like to ask questions? >> [inaudible] >> okay. i will try to speak a bit fast. [laughter] if i do it in an american accent, is that easier? [laughter] >> hello. people would like to imagine that clinical trials are immaculate. and this is a really important thing -- an important piece of background for you -- which is that missing
clinical trial data involves no cherry-picking and no anecdote. the single stories that i gave you were illustrations of a wider phenomenon, and that phenomenon was evidenced by systematic reviews of dozens of studies. okay? anyone who says that is cherry-picking is a giant liar with their pants usually on fire. [laughter] now, i'm not saying that the design flaws i'm about to describe are hugely prevalent, but some of them do happen, and some are harmful. so: you can compare your drug against something that is rubbish. there are lots of trials of the new generation of antipsychotic
drugs where the comparator, a drug in the old classification, was given at a higher dosage -- so the new drug's side effects look good by comparison. you can also compare against nothing, against placebo. and i don't mean that people should never use placebo, but it is important to get over the fantasies that some people have. what you are interested in, as a patient, is whether the new treatment is better than the best treatment we already have right now. that is the first decision you are trying to make. and yet 30% of new drugs had only been compared against placebo, even though there was current
data showing an effective treatment for those conditions already existed. next. so, the game goes roughly like this: you have to guess where the ball is, okay? you put down a dollar; if you guess which cup the ball is under, you get $3 back. how many cups are you allowed to look under in this game? well, one of them. okay -- if you look under two cups, i'm not going to give you $3. so how do you tilt the odds in your favour, one way or another? well, first i might say: i'm going to measure this. i'm going to measure the depression and anxiety scale.
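The cup game is the multiple-comparisons problem: each extra outcome measure is another cup you are allowed to look under. A quick sketch of how the false-positive odds grow (my illustration, not from the talk):

```python
# If a trial measures k independent outcomes and declares success when
# ANY of them crosses p < 0.05, the chance of a "win" on pure noise
# grows quickly with k.
alpha = 0.05

def false_positive_rate(k):
    """Chance that at least one of k null outcomes looks significant."""
    return 1 - (1 - alpha) ** k

for k in (1, 3, 12):
    print(f"{k:2d} outcomes -> {false_positive_rate(k):.0%} false-positive rate")
```

At twelve outcomes, the chance of at least one spuriously "significant" result is about 46%, roughly the 50/50 odds the talk arrives at.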
and maybe i'm going to use this general health questionnaire, and before you know it, i have 12 different things. if i measure 12 different things in a study, right, and i allow myself to say that if any one of them shows a significant benefit at the end of the study, that means the treatment was a success -- well, if you measure 12 different things with a cutoff of p less than 0.05, you have about a 50/50 chance of finding a benefit even if it is just statistical noise in the data, even when there is nothing happening at all. the academic journals are somehow viewed as the gatekeepers of wisdom, but often the primary outcome specified before the trial is completely different from the outcome in the published
papers. [inaudible] yet people fail to catch the most basic things. so, next, okay: you can run a trial in an unrepresentative patient population, and this is a very interesting one. it shows how the flaws and vulnerabilities stack up. we have made trials inappropriately expensive; we have made them this peculiar exception in medicine. and people are desperate to do everything they can to find a benefit. one way you can do it is by recruiting an ideal patient population:
people that are younger, healthier, people who are on no other drugs and i'm no other medical problems in a much less chance of showing a benefit. once you have done this test in this perfect population, we say are the results of this trial really applicable to my population. consisting of a lady in her 70s it is actually working here it is so preposterous on a scale of the society between the world of the trial and the reality they took out 179
patients from family doctors in the uk who had to of have known whose treatment was being managed. the best practice guidelines are the decision trees that were used. they are informed. we have real-world patients were being treated in accordance with the results. and what proportion of them would've been eligible to participate in a couple of these trials. 6%. it is not difficult for five people to do trials on him. and yet, these trials are being conducted in this way. it is on page 172 of the book. [laughter] okay, going into this, seeding trials, you have to salute this. this is your word. [laughter]
So suppose you want to do a trial in a real-world population of 2,000 patients, and say it is a migraine drug. Is migraine a real problem? It is common. An outpatient clinic, how many migraine cases are they going to see in a year? Hundreds? Maybe a thousand? This is bread-and-butter medicine, right? So you want 2,000 migraine patients in a trial. Every site you add means going through good-clinical-practice training and getting everybody up to speed on the codes of conduct. So the rational way to do it is something like four centres each recruiting 500 patients, or something around that. So what should we think when we see a trial that recruits two patients in each of a thousand centres?
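That recruitment pattern can be stated as simple arithmetic. A toy sketch (the 2,000-patient figure is from the talk; the site counts are the illustrative ones discussed there):

```python
# A trial's recruitment pattern: total patients spread across sites.
# Very low patients-per-site is the tell-tale shape of a seeding
# trial, whose real purpose is to put the drug into many
# prescribers' hands rather than to answer a question efficiently.
def patients_per_site(total_patients: int, sites: int) -> float:
    return total_patients / sites

rational = patients_per_site(2000, 4)      # a few big centres
seeding = patients_per_site(2000, 1000)    # two patients per centre

print(f"rational design: {rational:.0f} patients per site")
print(f"seeding pattern: {seeding:.0f} patients per site")
```

Because every site carries fixed training and setup costs, the thousand-site design is wildly inefficient as science, which is exactly why the pattern is suspicious.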
And this is the fiendish thing, because you dare not say, just from a peculiar pattern of recruitment, that something is a seeding trial. No, no, someone will say, that is a horrible accusation, we were just recruiting broadly from the population. It is probably nonsense, but you can only prove it if you have access to internal documents, and this is why companies are so desperate to settle out of court: because court disclosure has a bit of a headline problem for them. Once you get access to the documents, you find people saying, well, this is interesting: the idea of a seeding trial is not to find out what the treatment does. It is to get doctors familiar with prescribing the drug. The thing that I love so much is just imagining the doctors proudly saying, oh, you should try this drug, we have a lot of our regulars on this trial, without realising what an embarrassing human being they are being made into. [laughter] Lastly, marketing, a perfect example of the recurring theme in this interlocking ecosystem: it is all about people and their vulnerabilities in the system. Marketing shouldn't matter. It is only an issue because we have failed to build evidence into everyday decision-making, into monitoring prescribing habits, into making sure that we get rapid uptake of treatments that work and rapid disinvestment from things that don't. We are incredibly bad at that. If you put me in charge of the entire medical research budget for the universe, here is something I think we could survive: for just one year, I would cancel all new clinical trials and spend every penny we have on creating a better information infrastructure to get existing evidence out to the decision-makers. Because currently we spend hundreds of millions of dollars producing the trials and then completely drop the ball on dissemination. We have prescribing advisers, pharmacy schemes, medication reviews for geriatric patients and so on, and it is hopeless. Because we don't do that stuff properly, we are vulnerable to marketing. I cannot blame marketing for filling the gap when we fail to inform decision-makers about what works; I just think that we should be above it. Next, ghostwriting. This person is famous for having a lovely face and getting breast implants. Her name is Jordan.
You will probably not be surprised to hear that there was a ghostwriter involved in the creation of her novels. In academic medicine, the prevalence of ghostwriting is hard to pin down, and there are shades of gray. A commercial medical writing company will draft a paper, take it to an academic, and say: would you like to put your name on this? They will handle all the fiddle and the fuss, and you get your name in the journal. Sometimes they sweeten the deal with cash. Sometimes the company's involvement is not declared very flamboyantly and publicly; the name is not exactly in lights, just a brief mention of a company somewhere. Sometimes it is completely hidden. I see two kinds of heinousness in this. First, it distorts the content of the literature: a bunch of people with money to make are able to spend a lot of time and effort getting their message out, so their part of the discourse bulges outward. And there is a second concern that I think is almost a bigger one, around the medical workforce. Who here works in academia?
Okay, how do you get ahead in academia? You publish. You get academic publications, you become a respected man or woman of science, and so on. What is interesting is that ghostwriting helps people get lots of papers out, so those people are being selectively propelled up the hierarchy in academic medicine. We are selectively promoting the people who are most willing to participate in what we might politely regard as a moral gray area. [laughter] We are promoting these people specifically into positions in which they accumulate the appearance of authority. It is bizarre and peculiar. And think about publishing a paper: is that an easy or a difficult thing to do in an academic journal? It is a difficult business. So if someone hands you a finished paper with your name on it, they are giving you something worth far more than cash. If I wanted to influence you, maybe I shouldn't give you a $2,000 grant; maybe I should employ a commercial medical writing company instead. So, just a few years ago, general practitioners were sent tens of thousands of documents, reprints describing how fantastic a drug was, and there were blatant problems with them. When it came out, it turned out the claims were not what had been presented in the academic journals. It was a lie.
[laughter] Terrible. It is ridiculous. Next, drug reps. Drug company sales representatives talk to doctors and try to change what they prescribe. In the U.S. they give you samples, and they give you cherry-picked information. It is not, of course, a brilliant, even-handed dissemination of evidence. I suppose I can remember facts, but I'm not sure that, four years after hearing it, I can trust myself not to carry a slightly distorted picture of the benefits and risks of a treatment that was whispered into my ear, so I'm just not going to get involved in that whole circuit. And this matters: a systematic review published in 2011 found that doctors who see drug reps are, overall, less likely to follow best-practice guidelines. Why else would companies spend billions of dollars on this stuff? Of course it works. I am not into banning things, but we should be able to talk about it, and we should be able to say to doctors who are seeing the drug representatives: are you sure that's wise? And then finally, medical education, another fantastic illustration of how people's sense of what is acceptable drifts inside this overall ecosystem of dissemination. You qualify as a doctor in your 20s, you finish training as a specialist in your 30s, and then you work independently for the next 30 or 40 years, during which medicine changes completely. For all of that, you are basically self-taught. Doctors are not very keen on paying for their own continuing education, and the state obviously isn't very keen on paying either, and so the industry steps in: the industry spends more on doctors' ongoing education than anybody else does. So doctors are taught about drugs by industry, by the very people who make them. I think it should be relevant and okay to talk about that. We have codes of conduct and regulations, and you can ask whether they actually work.
But of course, when you send mystery shoppers along, you find a little bit of lying, not outright lying, and that is a really interesting thing to grasp for understanding the moral framework in which all of this happens. Take the hospitals up on Pill Hill, I would guess. [laughter] There will be one clinic with three doctors. Let's say two of them base their prescribing on a reasonable mixture of generic drugs that are pretty cheap, and one doctor says: actually, I think this one particular branded drug is better, because of its mechanism of action, because of the genuine merits of that one drug. Not corrupt, just an honest opinion. The sales representatives will identify that doctor's fondness for their drug and give that doctor a platform and a microphone, so their message is amplified to the medical community: a few hundred dollars for teaching locally, thousands for teaching specialists in their field, and perhaps speaking slots at golf resorts and international programs. Now, that doctor has not changed their view for money, and the company can say: this is simply a person who genuinely loves our drug, and we think he is a charismatic person, so we are going to shepherd him around and help him share his message. A lot of the time, you know, everyone is trying to be nice. So it is not that people are bad and evil, with very few exceptions. It is interesting when people deny the problems, when they produce colorful denunciations and say, oh, this is cherry-picking, a conspiracy theory. I draw the line when people say that these real problems shouldn't be talked about. I do think the incentives for distortion are there, and I think the architecture has several interesting technical flaws, and we should fix them. We should fix the vulnerabilities and frailties in the system. We should force people to publish all clinical trials; it is deranged that we do not. And I genuinely believe that the people of the future, the doctors of the future, will look back on this era of medicine, in which half of all clinical trials went missing in action, the way we look back on medieval bloodletting. We spend hundreds of millions of dollars producing carefully designed clinical trials, built to detect modest differences between one treatment and another, and then we throw the baby out with the bathwater: half of them go unpublished. We have known it was a problem for decades and done nothing about it. People will be unable to believe how stupid we were. [applause]
>> I would just like to remind everyone that if you have a question, please come to this microphone. We have 20 minutes for questions and a full house, so we will get through as many as possible. Thank you.
>> Two questions. One is an observation from somebody who works in basic science; I am in biochemistry
pharmacology. I have colleagues who produce half a dozen manuscripts a year, plus half a dozen reviews, and there is this enormous proliferation of medical journals and basic science journals. Every week I receive two or three invitations from this huge profit-driven explosion of thousands of new journals trying to get you onto their editorial boards, so they can somehow profit from that model.
>> The question of journals is interesting, and a bit different. The thing I find interesting is that this ad hoc ecosystem of journals is demonstrably outdated. The volume of papers being published today clearly requires a systematic approach, and yet that has been given an incredibly low priority. Take the way we report clinical trials: they are pretty much all the same experiment, except that the outcome being measured is a bit different. Instead of forcing people to report them in a structured data format, we let people write essays, like it's 1876. It is utterly bizarre. So the problems with journals are a microcosm of how we just haven't thought carefully enough about the information architecture of scientific data. We need research about research, and we need to build that framework instead of just bolting things onto this stuff. It is bizarre.
>> Thank you for coming to talk, first off.
>> I like your eyebrows. [laughter]
>> I'll relay my own thought, but he wanted me to relay his question. In the mid-2000s or so, there were a couple of biologists talking about bringing in a central administrator for all clinical trials, so that all funding, all approvals, and all results would flow through one place, as opposed to having clinical trials scattered every which way. Is there any movement toward implementation, any result we might achieve in the next five years or so, to get more transparency?
>> Missing results are a problem in academia as well; academic trials very frequently go missing in action too. A lot of people have fantastical notions of how we can fix this, like nationalizing the industry, which you weren't suggesting, or putting all research funding into one big pot. I am not sure that we have tried the real basic stuff yet.
So we have registers like ClinicalTrials.gov. But when you look across the registers, what you see is an extraordinary non-overlapping patchwork of incomplete lists of trials around the world. In the U.S., registration has been required as part of marketing approval for the last few years, but that still leaves gaps. Take Europe: for all trials conducted and completed in Europe since 2004, the agency should be holding a complete list. That is what I think the public would honestly expect us to be doing, and it is laughable that we are not; it is going to be a massive political scandal. Under the European legislation we essentially say to companies: okay, well done, you have approval, here are the forms, you can jot things down however you like. It is a spectacular and bizarre absence of ambition that we somehow think it is okay to say
okay, here are the trials we happen to know about, even though we know the trials run in China and India are growing every year; don't worry about the rest. It is hopeless. So I would try fixing the current model before turning everything upside down, you know.
>> This is probably a smaller part of the picture, but I'm wondering whether anyone reviews the reviewers. Dr. Tom Jefferson has written sceptically about flu vaccines, weighing analyses of their effectiveness against the side effects, and I have seen criticism of what he has pulled out. I'm just wondering if you can talk about who checks the reviewers.
>> I don't know the details of the flu vaccine story, but on reviewing the reviewers: anyone can comment on a systematic review. In the Tamiflu case, the problem was spotted after the British and Australian governments had already stockpiled the drug, when a Japanese doctor posted a comment on the Cochrane website saying, I'm a bit worried here: all but two of these trials have never been published. People would expect reviewers to approach the primary literature, rather than just relying on a brief summary done by someone else and sort of plugging those numbers in. That was sloppy. The reviewers said, fair point, we're going to go and have a look, and that is how the whole saga began; if only someone had raised those concerns a few years earlier.
>> So post them online?
>> Yes. Anything published, post your concerns and criticize it. If the question is who monitors whether the people who wrote a review did a good job, the answer is: you.
>> Okay. Afterwards, I wanted to talk with you about some of these things; they were very specific things.
>> Our country's system is pretty fractured, but we do have access to quite a bit of data over quite a bit of time, with a lot of different kinds of information coming together. If you were doing research on behalf of a health insurer, what kind of questions would you be asking of that data? Could you steer the ship toward a better outcome, and what kind of changes to payment policy or pharmacy benefits would you try to effect?
>> I would use those powers to address the market failure in answering the everyday question of which treatment is best. For example, in the UK we have electronic health records covering about 3 million people in one databank, and with that you could run randomized comparisons to see which of two commonly used treatments is actually better.
Take two statins: both lower cholesterol, both reduce your chances of dying, but they have never been properly compared head-to-head to see which is best. To run a big conventional trial to find out which one is better would be a really big, expensive project, because you have to collect follow-up data, and that is a very annoying thing about medical research: people can take ages to die. [laughter] So you need to follow people up for a long time.
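A minimal sketch of how treatment assignment could be embedded in routine health records for this kind of head-to-head comparison (this is my illustration, not Goldacre's actual system; all names and identifiers are invented):

```python
import hashlib

# Hypothetical point-of-care randomization: when two treatments are
# genuinely thought equivalent, derive each patient's arm from a
# hash of the comparison name and their record ID. Assignment is
# deterministic per patient but effectively random across patients,
# and routine health-record follow-up (events, deaths, causes of
# death) then serves as the outcome data.
def assign_arm(comparison: str, patient_id: str) -> str:
    digest = hashlib.sha256(f"{comparison}:{patient_id}".encode()).digest()
    return "statin A" if digest[0] % 2 == 0 else "statin B"

# The same patient always gets the same arm; across many patients
# the split comes out close to 50/50.
arms = [assign_arm("statin-head-to-head", f"patient-{i}") for i in range(10000)]
print(arms.count("statin A"), arms.count("statin B"))
```

A hash-based rule like this is one simple way to make assignment reproducible without storing a separate randomization table; a real system would of course need consent, governance, and proper statistical oversight.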
So how will we ever know which is best? In the health records, we could randomly assign patients to one statin or the other, and they would never have to think about it again. Doctors don't have to get heavily involved, and nobody has to come back for extra blood testing, because here you are comparing two things that have each already been shown to be safe in enormous numbers of people around the world. And you get the follow-up data for free from the health records: you can see when a patient has a health episode, you can see when they die, you even get the cause of death. We could turn our health systems into machines that are constantly testing, learning, and adapting through routine comparisons. I am not proposing this for cases where we already know what works best, only where nobody has any idea which treatment is better. All we have to do is persuade the population, and find doctors who have the humility to say to their patients: I genuinely don't know which of these is best. That also requires that we hold the line, because if one doctor says, those guys are fools, everybody knows which one is best, then that doctor blows the whole game. On the insurer question, another example of how this could work: you have all this data, and you could state that any new treatment that has just been approved, that is really expensive, and that has no data showing it is better than the existing available treatment, can only be obtained through a trial. You can say: if you want this treatment, we don't know if it works, we genuinely don't know if it works, so the right thing to do is to minimize the amount of time during which, arbitrarily, some people are given a treatment that subsequently turns out to be less effective. Let's find out as soon as possible without harming anybody. Why don't we do that? [laughter]
>> We can only count on having a member for a few years, maybe, so making health care less expensive for everybody doesn't actually pay off for any one insurer.
>> How about you gang up together? You could compete on other aspects and collaborate on the data.
>> What I was asking was more like: are there report cards for pharmaceutical manufacturers, as far as these things go?
>> Yes.
>> Can we talk about how dodgy they are?
>> Yes. We are running out of time, but I will do my best to answer.
>> We do have to move on to another session, but I want to remind everyone that the author will be available for book signing at the table after the talk.
>> When the book came out, a GP who is now an MP raised it, and this is what happened: we went through this and that, people agreed that missing trial data is a problem, there was a special inquiry, and then we decided we needed a proper campaign. Anyone can sign it; it has now passed 30,000 signatures and is still increasing, and more than 80 patient groups have signed. And lastly, GSK, one of the biggest companies in the world, signed up. So in answer to your question, we are getting to the point where, if there are two treatments and we don't know which is best, and one of them is made by a company that is not willing to share its data, doctors and payers might say: these people haven't made a commitment to transparency, so I will use the other one. Because that creates a market incentive for transparency. Thank you very much.
>> Is there a nonfiction author or book that you'd like to see featured on Book TV? Send us an e-mail at booktv@c-span.org or tweet us at twitter.com/booktv.
>> Other generations are asking: how do we adapt? How do we move? How do we go forward in this fast-paced world? The millennials are taking it all in stride, because that is the reality of how we grew up. It has also given us the ability to be resilient in an economic crisis, which has produced incredibly adaptable young people, many of whom are optimistic about their long-term economic